The Largest Senate Judiciary Committee Audience Is on Capitol Hill—and at Home—Today
from Net Politics and Digital and Cyberspace Policy Program

Meta's CEO Mark Zuckerberg returns to his seat after standing and facing the audience while he testified during the Senate Judiciary Committee hearing on online child sexual exploitation at the U.S. Capitol in Washington, D.C., on January 31, 2024. Evelyn Hockstein/Reuters

Today's Senate hearing on big tech and the child exploitation crisis should remind the public of Section 230's provision on parental controls, and of the real-world analogies that can clarify how social media platforms operate.

January 31, 2024 1:44 pm (EST)

Blog posts represent the views of CFR fellows and staff and not those of CFR, which takes no institutional positions.

Portions of this blog post were published on LinkedIn and can be found here.

Today, Senator Dick Durbin (D-IL), Chair of the Senate Judiciary Committee, welcomed what he deemed “the largest [audience] I’ve seen in this [Senate] room” for the “Big Tech and the Online Child Sexual Exploitation Crisis” hearing.

Five CEOs—from Discord, Meta, Snap, TikTok, and X (formerly Twitter)—are before the Judiciary Committee right now, facing an audience that includes “some parents who say that Instagram contributed to their children’s suicide or exploitation.” And the audience beyond the hearing room no doubt includes countless parents across the country who have suffered similar losses or live with the daily fear of them, as well as countless parents whose children and teens benefit from the creative, social, and other positive experiences that these and other tech platforms and services may, but don’t always, provide.

As a legal scholar who specializes in youth and digital citizenship, including online safety, and as a parent myself, here are a few things that I’m keeping a close eye on during, and after, this high-stakes hearing:


Let's talk about (the rest of) Section 230: There's a lot of talk in Washington, D.C., Silicon Valley, and other centers of law and commerce—including in today’s hearing—about whether we'll see Section 230 repeal or reform, with a focus on Section 230’s strong shield against tech platform liability for third-party content. But there is already one (and only one) obligation placed on tech companies under Section 230, and it's about what tech companies owe to parents and families.

We don't talk about this part of Section 230 enough, including in the hearing happening now. Here it is:

A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections... are available that may assist the customer in limiting access to material that is harmful to minors.

This is a very weak “obligation,” if it can even be called one, but it's better than nothing. It is current federal law, and we can and should make it mean more than it does. I am looking for Senators’ questions—and the post-hearing discussions and actions that will follow—to remind tech CEOs that they do already have an “obligation” in federal law about parental control protections—and to push them to fulfill this obligation in ways that honor all the policies in Section 230, including “encourag[ing] the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet.”
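To make that statutory language concrete, here is a minimal, purely hypothetical sketch of what surfacing the Section 230(d) notice at sign-up could look like. Nothing in it comes from any real platform: the names (ParentalControlNotice, present_at_signup, complete_signup) and the notice text are invented for illustration, and the statute leaves the “manner deemed appropriate” to each provider.

```python
# Hypothetical sketch only: all names and notice text below are invented
# for illustration and do not come from any real platform's codebase.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ParentalControlNotice:
    """Records that the Section 230(d)-style notice was shown at sign-up."""

    message: str = (
        "Parental control protections (such as content filters and "
        "screen-time tools) are available and may assist you in limiting "
        "access to material that is harmful to minors."
    )
    shown_at: datetime | None = None

    def present_at_signup(self) -> None:
        # Display the notice as part of the account agreement flow and
        # record a timestamp so the provider can later show compliance.
        print(self.message)
        self.shown_at = datetime.now(timezone.utc)


def complete_signup(username: str) -> ParentalControlNotice:
    # Finalize the account only after the notice has been presented,
    # mirroring the statute's "at the time of entering an agreement."
    notice = ParentalControlNotice()
    notice.present_at_signup()
    print(f"Account created for {username}.")
    return notice


if __name__ == "__main__":
    complete_signup("new_user")
```

Even a thin compliance flow like this shows how little the statute currently demands, which is exactly why pressing the CEOs on how they fulfill it matters.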


Let's go to the (pretend) mall: Brick-and-mortar analogies to digital platforms can be helpful thought exercises—not because physical places are ever perfect comparisons for digital realms, but because thinking about old-school brick and mortar helps us ask whether we would tolerate situations now occurring online if they were occurring instead at a (hypothetical) mall in our town. With bipartisan interest among federal lawmakers in protecting youth safety online, and growing state legislative and other action across blue and red divides, grounding the nationwide conversation in such familiar hypotheticals can help us identify our shared values and expectations for our children’s safety.

How would we feel if we were dropping our kids at the mall with their friends and, with some frequency (the CEOs can presumably tell our Senators exactly how often!), our kids were followed around by people holding signs promoting self-harm and other violence; approached by masked strangers pretending to be their friends and asking them to take naked pictures of themselves—pictures then posted on billboards at that mall and every other mall in the world, in perpetuity; and steered by mall staff to stay longer and shop more and more and more? Let's imagine this mall also has displays, put on both by the mall and by other visitors, of amazing art, robust political speech, and opportunities for our kids to make new friends their own age. How do we feel about the total experience of this mall for our kids? Is this a mall we'd welcome to our town?

Let’s give our kids and teens the Taylor Swift treatment: Recently, megastar Taylor Swift was the victim of non-consensual, AI-generated pornographic images shared across social media platforms. X responded by “blocking users from searching for her,” a “temporary action done with [an] abundance of caution.” Swift seems to have received platinum-level service in the platforms’ response to this attack. Platforms are to be commended for offering Swift this service—but they should also be pressed or, better yet, required (through future legislation, with necessary free speech and privacy protections) to make that same “blank space” protective treatment their standard level of service—the one available now to youth and families trying to address such attacks on themselves or their children—to protect youth safety, privacy, and lives.


Leah Plunkett is the author of Sharenthood: Why We Should Think Before We Talk About Our Kids Online and a member of the faculty at Harvard Law School.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.